
    Chinese vessel pollution prevention in the era of the green economy


    Prompt-Based Metric Learning for Few-Shot NER

    Few-shot named entity recognition (NER) targets generalizing to unseen labels and/or domains with few labeled examples. Existing metric learning methods compute token-level similarities between query and support sets, but are not able to fully incorporate label semantics into modeling. To address this issue, we propose a simple method to substantially improve metric learning for NER: 1) multiple prompt schemas are designed to enhance label semantics; 2) we propose a novel architecture to effectively combine multiple prompt-based representations. Empirically, our method achieves new state-of-the-art (SOTA) results under 16 of the 18 considered settings, substantially outperforming the previous SOTA by an average of 8.84% and a maximum of 34.51% in relative gains of micro F1. Our code is available at https://github.com/AChen-qaq/ProML
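    The combination described above can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: `embed` is a placeholder for an encoder conditioned on a prompt schema, and token-level similarities are averaged over schemas before nearest-support classification.

    ```python
    # Hedged sketch of prompt-based metric learning for few-shot NER.
    # `embed` stands in for a pretrained encoder conditioned on a prompt
    # schema; a real system would run a language model here.
    import math

    def embed(prompt, token):
        # Placeholder encoder producing a deterministic 4-d vector.
        return [float((hash((prompt, token, i)) % 100) - 50) for i in range(4)]

    def cosine(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u)) or 1.0
        nv = math.sqrt(sum(b * b for b in v)) or 1.0
        return dot / (nu * nv)

    def classify_token(query_token, support, prompts):
        # support: list of (token, label) pairs. Score each label by its
        # best support-token similarity, where each similarity is averaged
        # over the prompt schemas; return the highest-scoring label.
        scores = {}
        for tok, label in support:
            sim = sum(cosine(embed(p, query_token), embed(p, tok))
                      for p in prompts) / len(prompts)
            scores[label] = max(scores.get(label, -1.0), sim)
        return max(scores, key=scores.get)
    ```

    The averaging over schemas is one plausible way to combine prompt-based representations; the paper proposes a learned architecture for this step.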

    Zero-Label Prompt Selection

    Natural language prompts have been shown to facilitate cross-task generalization for large language models. However, with no or limited labeled examples, cross-task performance is highly sensitive to the choice of prompts, while selecting a high-performing prompt is challenging given the scarcity of labels. To address this issue, we propose a Zero-Label Prompt Selection (ZPS) method that selects prompts without any labeled data or gradient updates. Specifically, given candidate human-written prompts for a task, ZPS labels a set of unlabeled data with a prompt ensemble and uses the pseudo-labels for prompt selection. Experiments show that ZPS improves over prior methods by a sizeable margin in zero-label performance. We also extend ZPS to a few-shot setting and show its advantages over strong baselines such as prompt tuning and model tuning.
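    The selection procedure can be sketched in a few lines. This is a hedged illustration under assumptions: `model_predict` is a hypothetical stand-in for querying a language model with a prompt, and agreement with the ensemble's majority-vote pseudo-labels serves as the selection criterion.

    ```python
    # Hedged sketch of Zero-Label Prompt Selection (ZPS): pseudo-label
    # unlabeled data with a prompt ensemble, then pick the single prompt
    # that agrees most with those pseudo-labels. No gold labels, no
    # gradient updates.
    from collections import Counter

    def model_predict(prompt, example):
        # Placeholder: a real implementation would query an LLM.
        # Deterministic hashing keeps the sketch runnable.
        return hash((prompt, example)) % 2

    def zps_select(prompts, unlabeled):
        # 1) Ensemble pseudo-labels: majority vote across candidate prompts.
        preds = {p: [model_predict(p, x) for x in unlabeled] for p in prompts}
        pseudo = [Counter(col).most_common(1)[0][0]
                  for col in zip(*preds.values())]
        # 2) Select the prompt whose own predictions best match the
        #    ensemble pseudo-labels.
        def agreement(p):
            return sum(a == b for a, b in zip(preds[p], pseudo))
        return max(prompts, key=agreement)
    ```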

    NLP From Scratch Without Large-Scale Pretraining: A Simple and Efficient Framework

    Pretrained language models have become the standard approach for many NLP tasks due to strong performance, but they are very expensive to train. We propose a simple and efficient learning framework, TLM, that does not rely on large-scale pretraining. Given some labeled task data and a large general corpus, TLM uses task data as queries to retrieve a tiny subset of the general corpus and jointly optimizes the task objective and the language modeling objective from scratch. On eight classification datasets in four domains, TLM achieves results better than or similar to pretrained language models (e.g., RoBERTa-Large) while reducing the training FLOPs by two orders of magnitude. With high accuracy and efficiency, we hope TLM will contribute to democratizing NLP and expediting its development.
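    The retrieval step can be sketched as follows. This is a simplified, assumed formulation: a plain word-overlap score stands in for the sparse (BM25-style) retrieval a real implementation would use, and only the subset selection is shown, not the joint training objective.

    ```python
    # Hedged sketch of TLM's retrieval step: use labeled task texts as
    # queries to pull a small, relevant subset from a large general corpus.
    # Word overlap with the task vocabulary approximates relevance.
    def retrieve_subset(task_texts, corpus, k=2):
        query_vocab = {w for t in task_texts for w in t.lower().split()}
        def score(doc):
            words = doc.lower().split()
            # Fraction of the document's words that appear in task data.
            return sum(w in query_vocab for w in words) / (len(words) or 1)
        return sorted(corpus, key=score, reverse=True)[:k]
    ```

    In the full framework, the retrieved subset and the task data are then trained on jointly from scratch, combining a language-modeling loss with the task loss.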

    An Ensemble Method to Predict Student Performance in an Online Math Learning Environment

    The number of e-learning platforms and blended learning environments is continuously increasing and has sparked a lot of research on improving educational processes. Here, the ability to accurately predict student performance plays a vital role. Previous studies commonly focused on the construction of predictors tailored to a formal course. In this paper we relax this constraint, leveraging domain knowledge and combining a knowledge graph representation with activity scopes based on sets of didactically feasible learning objectives. Specialized scope classifiers are then combined into an ensemble to robustly predict student performance on learning objectives independently of the student's individual learning setting. The final ensemble's accuracy trumps that of any single classifier tested.
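    The ensemble idea can be sketched as below. This is a hypothetical illustration, not the paper's system: the scope classifiers are trivial threshold rules standing in for trained models, and a majority vote stands in for whatever combination scheme the authors use.

    ```python
    # Hedged sketch: specialized per-scope classifiers combined into a
    # majority-vote ensemble predicting pass/fail on a learning objective.
    from collections import Counter

    def make_scope_classifier(threshold):
        # Each classifier judges the student from features within its own
        # activity scope; here, a simple score threshold.
        return lambda features: features["score"] >= threshold

    def ensemble_predict(classifiers, features):
        # Majority vote over the specialized scope classifiers.
        votes = [clf(features) for clf in classifiers]
        return Counter(votes).most_common(1)[0][0]
    ```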

    A New Behavior of Nuclei during Mitosis of Lilium Hybrids

    Mitosis is nuclear division plus cytokinesis, and produces two identical daughter cells through prophase, prometaphase, metaphase, anaphase, and telophase. However, a new nuclear behavior was observed in interspecific hybrid progenies of Lilium in our experiment. Highly unusual behaviors of nuclei appeared during mitosis, such as sprouting or germination, tube-like elongation, penetration of the cell membrane into a neighboring cell, expansion of the tip of the nuclear tube, ingression and splitting of the tube-like nucleus, and micronucleus formation. Furthermore, meiosis-like tetrads were found during mitosis in the root.

The sequence of the unusual nuclear behaviors observed in our experiment may be summarized as: nucleus germination → tube-like elongation → penetration of the cell membrane → entry into a neighboring cell → expansion of the nuclear tube tip → tube ingression and splitting → formation of a new nucleus or micronucleus.

Many kinds of abnormal mitosis caused by chemical and physical induction, such as unequal division, chromosome bridges, lagging chromosomes, and multiple nuclei, have resulted in variations of chromosome number and structure. However, this new nuclear behavior is reported here for the first time; these phenomena imply that DNA may readily migrate from one cell to another. Therefore, the unusual behaviors of nuclei in hybrid progenies of Lilium not only create mutations for the breeding of new cultivars, but may also provide ideal material for the simple transfer of exotic DNA or genes in meristem tissue. This mode of nuclear behavior is a new addition to the cytogenetics of vegetatively propagated plants and suggests a new genetic mechanism of species evolution through interspecific hybridization.